2025-11-13 08:31:58,735 [ 98472 ] INFO : ClickHouse root is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse (runner:53, check_args_and_update_paths)
2025-11-13 08:31:58,735 [ 98472 ] INFO : Cases dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:79, check_args_and_update_paths)
2025-11-13 08:31:58,735 [ 98472 ] INFO : utils dir is not set. Will use /home/ubuntu/_work/ClickHouse/ClickHouse/utils (runner:90, check_args_and_update_paths)
2025-11-13 08:31:58,735 [ 98472 ] INFO : base_configs_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/programs/server, binary: /home/ubuntu/_work/_temp/test/build/clickhouse, cases_dir: /home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration (runner:92, check_args_and_update_paths)
clickhouse_integration_tests_volume
Running pytest container as: 'docker run --rm --name clickhouse_integration_tests_4t2s3x --privileged --dns-search='.' --memory=30709026816 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=5ccda723c1fc -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=d862517635bf -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active test_backup_restore_on_cluster/test_different_versions.py::test_different_versions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv " altinityinfra/integration-tests-runner:226bfaf75ac1 '.
Start tests
============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-7.4.4, pluggy-1.5.0 -- /usr/bin/python3
cachedir: .pytest_cache
Test order randomisation NOT enabled.
Enable with --random-order or --random-order-bucket=
rootdir: /ClickHouse/tests/integration
configfile: pytest.ini
plugins: timeout-2.3.1, repeat-0.9.3, order-1.0.0, reportlog-0.4.0, xdist-3.5.0, random-order-1.1.1
timeout: 900.0s
timeout method: signal
timeout func_only: False
created: 10/10 workers
10 workers [5 items]

scheduling tests via LoadFileScheduling

test_database_delta/test.py::test_complex_table_schema
test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
[gw0] [ 20%] SKIPPED test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
[gw4] [ 40%] FAILED test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
[gw2] [ 60%] FAILED test_database_delta/test.py::test_complex_table_schema
test_database_delta/test.py::test_embedded_database_and_tables
[gw2] [ 80%] FAILED test_database_delta/test.py::test_embedded_database_and_tables
test_database_delta/test.py::test_multiple_schemes_tables
[gw2] [100%] FAILED test_database_delta/test.py::test_multiple_schemes_tables

=================================== FAILURES ===================================
___________________________ test_different_versions ____________________________
[gw4] linux -- Python 3.10.12 /usr/bin/python3

    def test_different_versions():
        new_node.query(
            "CREATE TABLE tbl"
            " ON CLUSTER 'cluster_ver'"
            " (x UInt64) ENGINE=ReplicatedMergeTree('/clickhouse/tables/tbl/', '{replica}')"
            " ORDER BY tuple()"
        )
        new_node.query(f"INSERT INTO tbl VALUES (1)")
        old_node.query(f"INSERT INTO tbl VALUES (2)")
        backup_name = new_backup_name()
        initiator = random_node()
        print(f"Using {get_node_name(initiator)} as initiator for BACKUP")
        initiator.query(f"BACKUP TABLE tbl ON CLUSTER 'cluster_ver' TO {backup_name}")
        new_node.query("DROP TABLE tbl ON CLUSTER 'cluster_ver' SYNC")
        initiator = random_node()
        print(f"Using {get_node_name(initiator)} as initiator for RESTORE")
        initiator.query(f"RESTORE TABLE tbl ON CLUSTER 'cluster_ver' FROM {backup_name}")
        new_node.query("SYSTEM SYNC REPLICA ON CLUSTER 'cluster_ver' tbl")
        assert new_node.query("SELECT * FROM tbl ORDER BY x") == TSV([1, 2])
        assert old_node.query("SELECT * FROM tbl ORDER BY x") == TSV([1, 2])
        # Error NO_ELEMENTS_IN_CONFIG is unrelated.
>       assert (
            new_node.query(
                "SELECT name, last_error_message FROM system.errors WHERE NOT ("
                "(name == 'NO_ELEMENTS_IN_CONFIG')"
                ")"
            )
            == ""
        )
E       assert "NETLINK_ERROR\tCan\\'t receive Netlink response: error -2\n" == ''
E         + NETLINK_ERROR Can\'t receive Netlink response: error -2

test_backup_restore_on_cluster/test_different_versions.py:105: AssertionError
---------------------------- Captured stdout setup -----------------------------
Copy common default production configuration from /clickhouse-config. Files: config.xml, users.xml
Copy common default production configuration from /clickhouse-config.
Files: config.xml, users.xml ------------------------------ Captured log setup ------------------------------ 2025-11-13 08:32:05.458000 [ 645 ] DEBUG : Command:[docker ps | wc -l] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.482000 [ 645 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-11-13 08:32:05.482000 [ 645 ] DEBUG : No running containers (conftest.py:95, cleanup_environment) 2025-11-13 08:32:05.482000 [ 645 ] DEBUG : Pruning Docker networks (conftest.py:97, cleanup_environment) 2025-11-13 08:32:05.482000 [ 645 ] DEBUG : Command:[docker network prune --force] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.502000 [ 645 ] DEBUG : Command:[sysctl net.ipv4.ip_local_port_range='55000 65535'] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.505000 [ 645 ] DEBUG : Stdout:net.ipv4.ip_local_port_range = 55000 65535 (cluster.py:145, run_and_check) 2025-11-13 08:32:05.506000 [ 645 ] INFO : Running tests in /ClickHouse/tests/integration/test_backup_restore_on_cluster/test_different_versions.py (cluster.py:2738, start) 2025-11-13 08:32:05.506000 [ 645 ] DEBUG : Cluster start called. is_up=False (cluster.py:2745, start) 2025-11-13 08:32:05.529000 [ 645 ] DEBUG : Docker networks for project roottestbackuprestoreonclusterdifferentversions-gw4 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:05.549000 [ 645 ] DEBUG : Docker containers for project roottestbackuprestoreonclusterdifferentversions-gw4 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:05.571000 [ 645 ] DEBUG : Docker volumes for project roottestbackuprestoreonclusterdifferentversions-gw4 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:32:05.572000 [ 645 ] DEBUG : Cleanup called (cluster.py:851, cleanup) 2025-11-13 08:32:05.597000 [ 645 ] DEBUG : Docker networks for project roottestbackuprestoreonclusterdifferentversions-gw4 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:05.621000 [ 645 ] DEBUG : Docker containers for project roottestbackuprestoreonclusterdifferentversions-gw4 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:05.644000 [ 645 ] DEBUG : Docker volumes for project roottestbackuprestoreonclusterdifferentversions-gw4 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:32:05.644000 [ 645 ] DEBUG : Command:[docker container list --all --filter name='^/roottestbackuprestoreonclusterdifferentversions-gw4-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.667000 [ 645 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup) 2025-11-13 08:32:05.667000 [ 645 ] DEBUG : No running containers for project: roottestbackuprestoreonclusterdifferentversions-gw4 (cluster.py:879, cleanup) 2025-11-13 08:32:05.667000 [ 645 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup) 2025-11-13 08:32:05.690000 [ 645 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup) 2025-11-13 08:32:05.691000 [ 645 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.725000 [ 645 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check) 2025-11-13 08:32:05.725000 [ 645 ] DEBUG : Images pruned (cluster.py:904, cleanup) 2025-11-13 08:32:05.725000 [ 645 ] DEBUG : Trying to prune unused volumes... 
(cluster.py:910, cleanup) 2025-11-13 08:32:05.725000 [ 645 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.750000 [ 645 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-11-13 08:32:05.751000 [ 645 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup) 2025-11-13 08:32:05.751000 [ 645 ] DEBUG : Setup directory for instance: new_node (cluster.py:2758, start) 2025-11-13 08:32:05.752000 [ 645 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4628, create_dir) 2025-11-13 08:32:05.752000 [ 645 ] DEBUG : Create directory for common tests configuration (cluster.py:4633, create_dir) 2025-11-13 08:32:05.752000 [ 645 ] DEBUG : Copy common configuration from helpers (cluster.py:4653, create_dir) 2025-11-13 08:32:05.753000 [ 645 ] DEBUG : Generate and write macros file (cluster.py:4705, create_dir) 2025-11-13 08:32:05.753000 [ 645 ] DEBUG : Copy custom test config files ['/ClickHouse/tests/integration/test_backup_restore_on_cluster/configs/backups_disk.xml', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/configs/cluster_different_versions.xml'] to /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/configs/config.d (cluster.py:4741, create_dir) 2025-11-13 08:32:05.754000 [ 645 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/database (cluster.py:4758, create_dir) 2025-11-13 08:32:05.754000 [ 645 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/logs (cluster.py:4769, create_dir) 2025-11-13 08:32:05.754000 [ 645 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4850, create_dir) 2025-11-13 08:32:05.754000 [ 645 ] INFO : external_dir_abs_path=/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/backups (cluster.py:4879, create_dir) 2025-11-13 08:32:05.755000 [ 645 ] DEBUG : Setup directory for instance: old_node (cluster.py:2758, start) 2025-11-13 08:32:05.755000 [ 645 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4628, create_dir) 2025-11-13 08:32:05.755000 [ 645 ] DEBUG : Create directory for common tests configuration (cluster.py:4633, create_dir) 2025-11-13 08:32:05.755000 [ 645 ] DEBUG : Copy common configuration from helpers (cluster.py:4653, create_dir) 2025-11-13 08:32:05.756000 [ 645 ] DEBUG : Generate and write macros file (cluster.py:4705, create_dir) 2025-11-13 08:32:05.756000 [ 645 ] DEBUG : Copy custom test config files ['/ClickHouse/tests/integration/test_backup_restore_on_cluster/configs/backups_disk.xml', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/configs/cluster_different_versions.xml'] to /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/configs/config.d (cluster.py:4741, create_dir) 2025-11-13 08:32:05.757000 [ 645 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/database (cluster.py:4758, create_dir) 2025-11-13 08:32:05.757000 [ 645 ] DEBUG : Setup logs dir 
/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/logs (cluster.py:4769, create_dir) 2025-11-13 08:32:05.757000 [ 645 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4850, create_dir) 2025-11-13 08:32:05.758000 [ 645 ] INFO : external_dir_abs_path=/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/backups (cluster.py:4879, create_dir) 2025-11-13 08:32:05.758000 [ 645 ] DEBUG : Env {'ASAN_OPTIONS': 'use_sigaltstack=0', 'TSAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw', 'keeper_binary': '/clickhouse', 'keeper_cmd_prefix': 'clickhouse keeper', 'image': 'altinityinfra/integration-test:5ccda723c1fc', 'user': '0', 'keeper_fs': 'bind', 'keeper_logs_dir1': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper1/log', 'keeper_config_dir1': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper1/config', 'keeper_db_dir1': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper1/coordination', 'keeper_logs_dir2': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper2/log', 'keeper_config_dir2': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper2/config', 'keeper_db_dir2': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper2/coordination', 'keeper_logs_dir3': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper3/log', 'keeper_config_dir3': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper3/config', 'keeper_db_dir3': '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper3/coordination'} stored in /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/.env (cluster.py:96, _create_env_file) 2025-11-13 08:32:05.758000 [ 645 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-11-13 08:32:05.758000 [ 645 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-11-13 08:32:05.759000 [ 645 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-11-13 08:32:05.759000 [ 645 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-11-13 08:32:05.771000 [ 645 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 826 (connectionpool.py:547, _make_request) 2025-11-13 08:32:05.772000 [ 645 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw4 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file 
/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/docker-compose.yml pull] (cluster.py:121, run_and_check) 2025-11-13 08:32:16.240000 [ 645 ] DEBUG : Stderr: zoo2 Skipped - Image is already being pulled by zoo1 (cluster.py:147, run_and_check) 2025-11-13 08:32:16.240000 [ 645 ] DEBUG : Stderr: zoo3 Skipped - Image is already being pulled by zoo1 (cluster.py:147, run_and_check) 2025-11-13 08:32:16.240000 [ 645 ] DEBUG : Stderr: new_node Skipped - Image is already being pulled by zoo1 (cluster.py:147, run_and_check) 2025-11-13 08:32:16.240000 [ 645 ] DEBUG : Stderr: old_node Pulling (cluster.py:147, run_and_check) 2025-11-13 08:32:16.241000 [ 645 ] DEBUG : Stderr: zoo1 Pulling (cluster.py:147, run_and_check) 2025-11-13 08:32:16.241000 [ 645 ] DEBUG : Stderr: zoo1 Pulled (cluster.py:147, run_and_check) 2025-11-13 08:32:16.241000 [ 645 ] DEBUG : Stderr: old_node Pulled (cluster.py:147, run_and_check) 2025-11-13 08:32:16.241000 [ 645 ] DEBUG : Setup ZooKeeper (cluster.py:2799, start) 2025-11-13 08:32:16.241000 [ 645 ] DEBUG : Creating internal ZooKeeper dirs: ['/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper1/log', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper1/config', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper1/coordination', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper2/log', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper2/config', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper2/coordination', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper3/log', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper3/config', '/ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/keeper3/coordination'] (cluster.py:2800, start) 2025-11-13 08:32:16.243000 [ 645 ] DEBUG : Command:[docker compose --project-name roottestbackuprestoreonclusterdifferentversions-gw4 --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/.env --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --verbose up -d] (cluster.py:121, run_and_check) 2025-11-13 08:32:17.338000 [ 645 ] DEBUG : Stderr:time="2025-11-13T08:32:16Z" level=trace msg="Docker Desktop integration not enabled" (cluster.py:147, run_and_check) 2025-11-13 08:32:17.338000 [ 645 ] DEBUG : Stderr: Network roottestbackuprestoreonclusterdifferentversions-gw4_default Creating (cluster.py:147, run_and_check) 2025-11-13 08:32:17.338000 [ 645 ] DEBUG : Stderr: Network roottestbackuprestoreonclusterdifferentversions-gw4_default Created (cluster.py:147, run_and_check) 2025-11-13 08:32:17.338000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:32:17.338000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:32:17.338000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Creating (cluster.py:147, 
run_and_check) 2025-11-13 08:32:17.338000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr:time="2025-11-13T08:32:17Z" level=debug msg="otel error" error="" (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Stderr:time="2025-11-13T08:32:17Z" level=debug msg="otel error" error="" (cluster.py:147, run_and_check) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : Wait ZooKeeper to start (cluster.py:2436, wait_zookeeper_to_start) 2025-11-13 08:32:17.339000 [ 645 ] DEBUG : get_instance_ip instance_name=zoo1 (cluster.py:2005, get_instance_ip) 2025-11-13 08:32:17.342000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.343000 [ 645 ] DEBUG : get_kazoo_client: zoo1, ip:172.16.3.4, port:2181, use_ssl:False (cluster.py:3312, get_kazoo_client) 2025-11-13 08:32:17.345000 [ 645 ] INFO : Connecting to 172.16.3.4(172.16.3.4):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:32:17.345000 [ 645 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:32:17.474000 [ 645 ] INFO : Connecting to 172.16.3.4(172.16.3.4):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:32:17.474000 [ 645 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:32:17.781000 [ 645 ] INFO : Connecting to 172.16.3.4(172.16.3.4):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:32:17.781000 [ 645 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:32:18.225000 [ 645 ] INFO : Connecting to 172.16.3.4(172.16.3.4):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:32:18.226000 [ 645 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:32:18.983000 [ 645 ] INFO : Connecting to 172.16.3.4(172.16.3.4):2181, use_ssl: False 
(connection.py:650, _connect) 2025-11-13 08:32:18.984000 [ 645 ] WARNING : Connection dropped: socket connection error: Connection refused (connection.py:622, _connect_attempt) 2025-11-13 08:32:20.309000 [ 645 ] INFO : Connecting to 172.16.3.4(172.16.3.4):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:32:20.309000 [ 645 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=30000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit) 2025-11-13 08:32:20.319000 [ 645 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback) 2025-11-13 08:32:20.319000 [ 645 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit) 2025-11-13 08:32:20.321000 [ 645 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response) 2025-11-13 08:32:20.321000 [ 645 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit) 2025-11-13 08:32:20.328000 [ 645 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt) 2025-11-13 08:32:20.328000 [ 645 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt) 2025-11-13 08:32:20.328000 [ 645 ] INFO : Zookeeper connection lost (client.py:543, _session_callback) 2025-11-13 08:32:20.429000 [ 645 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. (connection.py:515, zk_loop) 2025-11-13 08:32:20.429000 [ 645 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback) 2025-11-13 08:32:20.429000 [ 645 ] DEBUG : get_instance_ip instance_name=zoo2 (cluster.py:2005, get_instance_ip) 2025-11-13 08:32:20.433000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:20.433000 [ 645 ] DEBUG : get_kazoo_client: zoo2, ip:172.16.3.2, port:2181, use_ssl:False (cluster.py:3312, get_kazoo_client) 2025-11-13 08:32:20.435000 [ 645 ] INFO : Connecting to 172.16.3.2(172.16.3.2):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:32:20.436000 [ 645 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=30000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit) 2025-11-13 08:32:20.441000 [ 645 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback) 2025-11-13 08:32:20.442000 [ 645 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit) 2025-11-13 08:32:20.443000 [ 645 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response) 2025-11-13 08:32:20.443000 [ 645 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit) 2025-11-13 08:32:20.449000 [ 645 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt) 2025-11-13 08:32:20.449000 [ 645 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt) 2025-11-13 08:32:20.449000 [ 645 ] INFO : Zookeeper connection lost (client.py:543, _session_callback) 2025-11-13 08:32:20.536000 [ 645 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. 
(connection.py:515, zk_loop) 2025-11-13 08:32:20.536000 [ 645 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback) 2025-11-13 08:32:20.536000 [ 645 ] DEBUG : get_instance_ip instance_name=zoo3 (cluster.py:2005, get_instance_ip) 2025-11-13 08:32:20.539000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:20.540000 [ 645 ] DEBUG : get_kazoo_client: zoo3, ip:172.16.3.3, port:2181, use_ssl:False (cluster.py:3312, get_kazoo_client) 2025-11-13 08:32:20.542000 [ 645 ] INFO : Connecting to 172.16.3.3(172.16.3.3):2181, use_ssl: False (connection.py:650, _connect) 2025-11-13 08:32:20.542000 [ 645 ] DEBUG : Sending request(xid=None): Connect(protocol_version=0, last_zxid_seen=0, time_out=30000, session_id=0, passwd=b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00', read_only=None) (connection.py:312, _submit) 2025-11-13 08:32:20.551000 [ 645 ] INFO : Zookeeper connection established, state: CONNECTED (client.py:532, _session_callback) 2025-11-13 08:32:20.552000 [ 645 ] DEBUG : Sending request(xid=1): GetChildren(path='/', watcher=None) (connection.py:312, _submit) 2025-11-13 08:32:20.553000 [ 645 ] DEBUG : Received response(xid=1): ['keeper'] (connection.py:410, _read_response) 2025-11-13 08:32:20.553000 [ 645 ] DEBUG : Sending request(xid=2): Close() (connection.py:312, _submit) 2025-11-13 08:32:20.558000 [ 645 ] WARNING : Connection dropped: socket connection broken (connection.py:622, _connect_attempt) 2025-11-13 08:32:20.559000 [ 645 ] WARNING : Transition to CONNECTING (connection.py:626, _connect_attempt) 2025-11-13 08:32:20.559000 [ 645 ] INFO : Zookeeper connection lost (client.py:543, _session_callback) 2025-11-13 08:32:20.653000 [ 645 ] WARNING : Failed connecting to Zookeeper within the connection retry policy. 
(connection.py:515, zk_loop) 2025-11-13 08:32:20.654000 [ 645 ] INFO : Zookeeper session closed, state: CLOSED (client.py:537, _session_callback) 2025-11-13 08:32:20.654000 [ 645 ] DEBUG : All instances of ZooKeeper started: ('zoo1', 'zoo2', 'zoo3') (cluster.py:2452, wait_zookeeper_nodes_to_start) 2025-11-13 08:32:20.654000 [ 645 ] DEBUG : ('Trying to create ClickHouse instance by command %s', 'docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw4 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/docker-compose.yml up -d --no-recreate') (cluster.py:3139, start) 2025-11-13 08:32:20.655000 [ 645 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw4 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/docker-compose.yml up -d --no-recreate] (cluster.py:121, run_and_check) 2025-11-13 08:32:21.201000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Running (cluster.py:147, run_and_check) 2025-11-13 08:32:21.201000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Running (cluster.py:147, run_and_check) 2025-11-13 08:32:21.201000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Running (cluster.py:147, run_and_check) 2025-11-13 08:32:21.201000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:32:21.201000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:32:21.202000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:32:21.202000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:32:21.202000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:32:21.202000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:32:21.202000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:32:21.202000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:32:21.202000 [ 645 ] DEBUG : ClickHouse instance created 
(cluster.py:3147, start) 2025-11-13 08:32:21.202000 [ 645 ] DEBUG : get_instance_ip instance_name=new_node (cluster.py:2005, get_instance_ip) 2025-11-13 08:32:21.206000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.206000 [ 645 ] DEBUG : get_instance_ip instance_name=new_node (cluster.py:2015, get_instance_global_ipv6) 2025-11-13 08:32:21.209000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.210000 [ 645 ] DEBUG : Waiting for ClickHouse start in new_node, ip: 172.16.3.6... (cluster.py:3155, start) 2025-11-13 08:32:21.212000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.215000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.319000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.422000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.525000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.629000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.733000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.836000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:21.940000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.044000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.148000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.250000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.353000 [ 645 ] DEBUG : http://localhost:None "GET 
/v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.456000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.559000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.663000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.766000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.869000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:22.973000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.076000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.180000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.283000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.387000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.491000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.594000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/1ded1976acc229cb3f8fce74d8598b2a890dfd283e90a759abc89866b51622a3/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.595000 [ 645 ] DEBUG : ClickHouse new_node started (cluster.py:3159, start) 2025-11-13 08:32:23.595000 [ 645 ] DEBUG : get_instance_ip instance_name=old_node (cluster.py:2005, get_instance_ip) 2025-11-13 08:32:23.598000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.598000 [ 645 ] DEBUG : get_instance_ip instance_name=old_node (cluster.py:2015, get_instance_global_ipv6) 2025-11-13 08:32:23.600000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:23.601000 [ 645 ] DEBUG : 
Waiting for ClickHouse start in old_node, ip: 172.16.3.5... (cluster.py:3155, start)
2025-11-13 08:32:23.602000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-11-13 08:32:23.604000 [ 645 ] DEBUG : http://localhost:None "GET /v1.46/containers/e2f4f753c61fdea4bb8fd4052624a1a97c8e247518d117dc05502984ac094f9e/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request)
2025-11-13 08:32:23.604000 [ 645 ] DEBUG : ClickHouse old_node started (cluster.py:3159, start)
----------------------------- Captured stdout call -----------------------------
Using old_node as initiator for BACKUP
Using old_node as initiator for RESTORE
------------------------------ Captured log call -------------------------------
2025-11-13 08:32:23.606000 [ 645 ] DEBUG : Executing query CREATE TABLE tbl ON CLUSTER 'cluster_ver' (x UInt64) ENGINE=ReplicatedMergeTree('/clickhouse/tables/tbl/', '{replica}') ORDER BY tuple() on new_node (cluster.py:3648, query)
2025-11-13 08:32:24.072000 [ 645 ] DEBUG : Executing query INSERT INTO tbl VALUES (1) on new_node (cluster.py:3648, query)
2025-11-13 08:32:24.490000 [ 645 ] DEBUG : Executing query INSERT INTO tbl VALUES (2) on old_node (cluster.py:3648, query)
2025-11-13 08:32:24.756000 [ 645 ] DEBUG : Executing query BACKUP TABLE tbl ON CLUSTER 'cluster_ver' TO Disk('backups', '1') on old_node (cluster.py:3648, query)
2025-11-13 08:32:25.373000 [ 645 ] DEBUG : Executing query DROP TABLE tbl ON CLUSTER 'cluster_ver' SYNC on new_node (cluster.py:3648, query)
2025-11-13 08:32:25.839000 [ 645 ] DEBUG : Executing query RESTORE TABLE tbl ON CLUSTER 'cluster_ver' FROM Disk('backups', '1') on old_node (cluster.py:3648, query)
2025-11-13 08:32:26.456000 [ 645 ] DEBUG : Executing query SYSTEM SYNC REPLICA ON CLUSTER 'cluster_ver' tbl on new_node (cluster.py:3648, query)
2025-11-13 08:32:26.872000 [ 645 ] DEBUG : Executing query SELECT * FROM tbl ORDER BY x on new_node (cluster.py:3648, query)
2025-11-13 08:32:27.188000 [ 645 ] DEBUG : Executing query SELECT * FROM tbl ORDER BY x on old_node (cluster.py:3648, query)
2025-11-13 08:32:27.454000 [ 645 ] DEBUG : Executing query SELECT name, last_error_message FROM system.errors WHERE NOT ((name == 'NO_ELEMENTS_IN_CONFIG')) on new_node (cluster.py:3648, query)
---------------------------- Captured log teardown -----------------------------
2025-11-13 08:32:27.823000 [ 645 ] DEBUG : Executing query DROP TABLE IF EXISTS tbl ON CLUSTER 'cluster_ver' SYNC on new_node (cluster.py:3648, query)
2025-11-13 08:32:28.289000 [ 645 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw4 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/docker-compose.yml stop --timeout 20] (cluster.py:121, run_and_check)
2025-11-13 08:32:35.702000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Stopping (cluster.py:147, run_and_check)
2025-11-13 08:32:35.702000 [ 645 ] DEBUG : Stderr: Container
roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:35.702000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:35.703000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:35.703000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:35.703000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:35.703000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:35.703000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:35.703000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:35.703000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:35.703000 [ 645 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:121, run_and_check) 2025-11-13 08:32:35.717000 [ 645 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:121, run_and_check) 2025-11-13 08:32:35.729000 [ 645 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/.env --project-name roottestbackuprestoreonclusterdifferentversions-gw4 --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/new_node/docker-compose.yml --file /ClickHouse/tests/integration/helpers/../../../tests/integration/compose/docker_compose_keeper.yml --file /ClickHouse/tests/integration/test_backup_restore_on_cluster/_instances-different_versions-1-gw4/old_node/docker-compose.yml down --volumes] (cluster.py:121, run_and_check) 2025-11-13 08:32:36.261000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Removing (cluster.py:147, 
run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-new_node-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-old_node-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Stopping (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:36.262000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Stopped (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Removing (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo1-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo3-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Container roottestbackuprestoreonclusterdifferentversions-gw4-zoo2-1 Removed (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Network roottestbackuprestoreonclusterdifferentversions-gw4_default Removing (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Stderr: Network roottestbackuprestoreonclusterdifferentversions-gw4_default Removed (cluster.py:147, run_and_check) 2025-11-13 08:32:36.263000 [ 645 ] DEBUG : Cleanup called (cluster.py:851, cleanup) 2025-11-13 08:32:36.288000 [ 645 ] DEBUG : Docker networks for project roottestbackuprestoreonclusterdifferentversions-gw4 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:36.313000 [ 645 ] DEBUG : Docker containers for project roottestbackuprestoreonclusterdifferentversions-gw4 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:36.337000 [ 645 ] DEBUG : Docker volumes for project roottestbackuprestoreonclusterdifferentversions-gw4 are 
DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces)
2025-11-13 08:32:36.337000 [ 645 ] DEBUG : Command:[docker container list --all --filter name='^/roottestbackuprestoreonclusterdifferentversions-gw4-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check)
2025-11-13 08:32:36.359000 [ 645 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup)
2025-11-13 08:32:36.360000 [ 645 ] DEBUG : No running containers for project: roottestbackuprestoreonclusterdifferentversions-gw4 (cluster.py:879, cleanup)
2025-11-13 08:32:36.360000 [ 645 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup)
2025-11-13 08:32:36.384000 [ 645 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup)
2025-11-13 08:32:36.385000 [ 645 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check)
2025-11-13 08:32:36.422000 [ 645 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check)
2025-11-13 08:32:36.422000 [ 645 ] DEBUG : Images pruned (cluster.py:904, cleanup)
2025-11-13 08:32:36.422000 [ 645 ] DEBUG : Trying to prune unused volumes... (cluster.py:910, cleanup)
2025-11-13 08:32:36.422000 [ 645 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check)
2025-11-13 08:32:36.447000 [ 645 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check)
2025-11-13 08:32:36.448000 [ 645 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup)
__________________________ test_complex_table_schema ___________________________
[gw2] linux -- Python 3.10.12 /usr/bin/python3

started_cluster = 

    def test_complex_table_schema(started_cluster):
        node1 = started_cluster.instances['node1']
        execute_spark_query(node1, "CREATE SCHEMA schema_with_complex_tables", ignore_exit_code=True)
        schema = "event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT"
        create_query = f"CREATE TABLE schema_with_complex_tables.complex_table ({schema}) using Delta location '/tmp/complex_schema/complex_table'"
        execute_spark_query(node1, create_query, ignore_exit_code=True)
        execute_spark_query(node1, "insert into schema_with_complex_tables.complex_table SELECT to_date('2024-10-01', 'yyyy-MM-dd'), to_timestamp('2024-10-01 00:12:00'), array(42, 123, 77), map(7, 'v7', 5, 'v5'), named_struct(\\\"f1\\\", 34, \\\"f2\\\", 'hello')", ignore_exit_code=True)
        node1.query("create database complex_schema engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
        complex_schema_tables = list(sorted(node1.query("SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
        assert len(complex_schema_tables) == 1
>       print(node1.query("SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table`"))

test_database_delta/test.py:125: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
helpers/cluster.py:3649: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 

    def get_answer(self):
        self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
        self.stdout_file.seek(0)
        self.stderr_file.seek(0)
        stdout = self.stdout_file.read().decode("utf-8", errors="replace")
        stderr = self.stderr_file.read().decode("utf-8", errors="replace")
        if (
            self.timer is not None
            and not self.process_finished_before_timeout
            and not self.ignore_error
        ):
            logging.debug(f"Timed out. Last stdout:{stdout}, stderr:{stderr}")
            raise QueryTimeoutExceedException("Client timed out!")
        if (
            self.process.returncode != 0 or self.remove_trash_from_stderr(stderr)
        ) and not self.ignore_error:
>           raise QueryRuntimeException(
                "Client failed! Return code: {}, stderr: {}".format(
                    self.process.returncode, stderr
                ),
                self.process.returncode,
                stderr,
            )
E           helpers.client.QueryRuntimeException: Client failed! Return code: 86, stderr: Received exception from server (version 25.3.8):
E           Code: 86. DB::Exception: Received from 172.16.1.2:9000. DB::HTTPException. DB::HTTPException: Received error from remote server http://localhost:8080/api/2.1/unity-catalog/tables/unity.schema_with_complex_tables.complex_table. HTTP status code: 404 'Not Found', body length: 182 bytes, body: '{"error_code":"NOT_FOUND","details":[{"reason":"NOT_FOUND","metadata":{},"@type":"google.rpc.ErrorInfo"}],"stack_trace":null,"message":"Schema not found: schema_with_complex_tables"}': while parsing JSON: . Stack trace:
E           
E           0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:113: Poco::Exception::Exception(String const&, int) @ 0x0000000020e10280
E           1. ./build_docker/./src/Common/Exception.cpp:108: DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0x0000000010b395b4
E           2. DB::Exception::Exception(PreformattedMessage&&, int) @ 0x00000000087eccc0
E           3. ./src/Common/Exception.h:130: DB::Exception::Exception(int, FormatStringHelperImpl::type, std::type_identity::type, std::type_identity::type, std::type_identity::type, std::type_identity::type>, String const&, int&&, String const&, unsigned long&&, String const&) @ 0x0000000010fcd3fc
E           4. ./build_docker/./src/IO/HTTPCommon.cpp:104: DB::HTTPException::HTTPException(int, String const&, Poco::Net::HTTPResponse::HTTPStatus, String const&, String const&) @ 0x0000000010fcd20e
E           5. ./build_docker/./src/IO/HTTPCommon.cpp:93: DB::assertResponseIsOk(String const&, Poco::Net::HTTPResponse&, std::basic_istream>&, bool) @ 0x0000000010fccf31
E           6. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:277: DB::ReadWriteBufferFromHTTP::callImpl(Poco::Net::HTTPResponse&, String const&, std::optional const&, bool) const @ 0x0000000013cfa822
E           7. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:285: DB::ReadWriteBufferFromHTTP::callWithRedirects(Poco::Net::HTTPResponse&, String const&, std::optional const&) @ 0x0000000013cfa9f4
E           8. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:408: DB::ReadWriteBufferFromHTTP::initialize() @ 0x0000000013cfb4a0
E           9. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:472: void std::__function::__policy_invoker::__call_impl[abi:ne190107]>(std::__function::__policy_storage const*) @ 0x0000000013cfe657
E           10. ./contrib/llvm-project/libcxx/include/__functional/function.h:716: ? @ 0x0000000013cf7eb9
E           11. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:465: DB::ReadWriteBufferFromHTTP::nextImpl() @ 0x0000000013cfcb61
E           12. DB::ReadBuffer::next() @ 0x000000000891fc7b
E           13. ./build_docker/./src/IO/ReadWriteBufferFromHTTP.cpp:254: DB::ReadWriteBufferFromHTTP::ReadWriteBufferFromHTTP(DB::HTTPConnectionGroupType const&, Poco::URI const&, String const&, DB::ProxyConfiguration, DB::ReadSettings, DB::ConnectionTimeouts, Poco::Net::HTTPBasicCredentials const&, DB::RemoteHostFilter const*, unsigned long, unsigned long, std::function>&)>, bool, bool, std::vector>, bool, std::optional) @ 0x0000000013cfa16e
E           14. ./contrib/llvm-project/libcxx/include/__memory/unique_ptr.h:634: std::__unique_if::__unique_single std::make_unique[abi:ne190107]>&)>&, bool&, bool&, std::vector>&, bool&, std::nullopt_t const&>(DB::HTTPConnectionGroupType&, Poco::URI&, String&, DB::ProxyConfiguration&, DB::ReadSettings&, DB::ConnectionTimeouts&, Poco::Net::HTTPBasicCredentials const&, DB::RemoteHostFilter const*&, unsigned long&, unsigned long&, std::function>&)>&, bool&, bool&, std::vector>&, bool&, std::nullopt_t const&) @ 0x0000000013cee6ab
E           15. ./src/IO/ReadWriteBufferFromHTTP.h:248: DataLake::createReadBuffer(String const&, std::shared_ptr, Poco::Net::HTTPBasicCredentials const&, std::vector, std::allocator>> const&, std::vector> const&, String const&, std::function>&)>) @ 0x00000000178418c8
E           16. ./build_docker/./src/Databases/DataLake/HTTPBasedCatalogUtils.cpp:50: DataLake::makeHTTPRequestAndReadJSON(String const&, std::shared_ptr, Poco::Net::HTTPBasicCredentials const&, std::vector, std::allocator>> const&, std::vector> const&, String const&, std::function>&)>) @ 0x0000000017841d1d
E           17. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:55: DataLake::UnityCatalog::getJSONRequest(String const&, std::vector, std::allocator>> const&) const @ 0x0000000017835378
E           18. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:146: DataLake::UnityCatalog::tryGetTableMetadata(String const&, String const&, DataLake::TableMetadata&) const @ 0x0000000017839bbd
E           19. ./build_docker/./src/Databases/DataLake/UnityCatalog.cpp:96: DataLake::UnityCatalog::getTableMetadata(String const&, String const&, DataLake::TableMetadata&) const @ 0x0000000017839655
E           20. ./build_docker/./src/Databases/DataLake/DatabaseDataLake.cpp:489: DB::DatabaseDataLake::getCreateTableQueryImpl(String const&, std::shared_ptr, bool) const @ 0x0000000017809a52
E           21. ./src/Databases/IDatabase.h:357: DB::InterpreterShowCreateQuery::executeImpl() @ 0x0000000018e9ca85
E           22. ./build_docker/./src/Interpreters/InterpreterShowCreateQuery.cpp:34: DB::InterpreterShowCreateQuery::execute() @ 0x0000000018e9c61f
E           23. ./build_docker/./src/Interpreters/executeQuery.cpp:1457: DB::executeQueryImpl(char const*, char const*, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum, DB::ReadBuffer*, std::shared_ptr&) @ 0x0000000018e2f0ee
E           24. ./build_docker/./src/Interpreters/executeQuery.cpp:1624: DB::executeQuery(String const&, std::shared_ptr, DB::QueryFlags, DB::QueryProcessingStage::Enum) @ 0x0000000018e29cef
E           25. ./build_docker/./src/Server/TCPHandler.cpp:664: DB::TCPHandler::runImpl() @ 0x000000001bc02725
E           26. ./build_docker/./src/Server/TCPHandler.cpp:2629: DB::TCPHandler::run() @ 0x000000001bc29488
E           27. ./build_docker/./base/poco/Net/src/TCPServerConnection.cpp:40: Poco::Net::TCPServerConnection::start() @ 0x0000000020f1dd63
E           28. ./build_docker/./base/poco/Net/src/TCPServerDispatcher.cpp:115: Poco::Net::TCPServerDispatcher::run() @ 0x0000000020f1e5d2
E           29. ./build_docker/./base/poco/Foundation/src/ThreadPool.cpp:205: Poco::PooledThread::run() @ 0x0000000020e98883
E           30. ./build_docker/./base/poco/Foundation/src/Thread.cpp:45: Poco::(anonymous namespace)::RunnableHolder::run() @ 0x0000000020e96c30
E           31. ./base/poco/Foundation/src/Thread_POSIX.cpp:335: Poco::ThreadImpl::runnableEntry(void*) @ 0x0000000020e94fea
E           . (RECEIVED_ERROR_FROM_REMOTE_IO_SERVER)
E           (query: SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table`)

helpers/client.py:248: QueryRuntimeException
---------------------------- Captured stdout setup -----------------------------
Copy common default production configuration from /clickhouse-config. Files: config.xml, users.xml
------------------------------ Captured log setup ------------------------------
2025-11-13 08:32:05.458000 [ 639 ] DEBUG : Command:[docker ps | wc -l] (cluster.py:121, run_and_check)
2025-11-13 08:32:05.482000 [ 639 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check)
2025-11-13 08:32:05.482000 [ 639 ] DEBUG : No running containers (conftest.py:95, cleanup_environment)
2025-11-13 08:32:05.482000 [ 639 ] DEBUG : Pruning Docker networks (conftest.py:97, cleanup_environment)
2025-11-13 08:32:05.482000 [ 639 ] DEBUG : Command:[docker network prune --force] (cluster.py:121, run_and_check)
2025-11-13 08:32:05.510000 [ 639 ] DEBUG : Command:[sysctl net.ipv4.ip_local_port_range='55000 65535'] (cluster.py:121, run_and_check)
2025-11-13 08:32:05.512000 [ 639 ] DEBUG : Stdout:net.ipv4.ip_local_port_range = 55000 65535 (cluster.py:145, run_and_check)
2025-11-13 08:32:05.513000 [ 639 ] DEBUG : ENV DOCKER_KERBEROS_KDC_TAG 9391ecdee8d7 (cluster.py:424, __init__)
2025-11-13 08:32:05.513000 [ 639 ] DEBUG : ENV CLICKHOUSE_TESTS_SERVER_BIN_PATH /clickhouse (cluster.py:424, __init__)
2025-11-13 08:32:05.514000 [ 639 ] DEBUG : ENV MSAN_OPTIONS abort_on_error=1 poison_in_dtor=1 (cluster.py:424, __init__)
2025-11-13 08:32:05.514000 [ 639 ] DEBUG : ENV JAVA_TOOL_OPTIONS -Djdk.attach.allowAttachSelf=true (cluster.py:424, __init__)
2025-11-13 08:32:05.514000 [ 639 ] DEBUG : ENV TSAN_OPTIONS halt_on_error=1 abort_on_error=1 history_size=7 memory_limit_mb=46080 second_deadlock_stack=1 (cluster.py:424, __init__)
2025-11-13 08:32:05.514000 [ 639 ] DEBUG : ENV HOSTNAME 8d37e70e6ed2 (cluster.py:424, __init__)
2025-11-13 08:32:05.514000 [ 639 ] DEBUG : ENV SHLVL 0 (cluster.py:424, __init__)
2025-11-13 08:32:05.514000 [ 639 ] DEBUG : ENV HOME /root (cluster.py:424, __init__)
2025-11-13 08:32:05.514000 [ 639 ] DEBUG : ENV OLDPWD / (cluster.py:424, __init__)
2025-11-13 08:32:05.514000 [ 639 ] DEBUG : ENV DOCKER_HELPER_TAG 5dc43a6382f0 (cluster.py:424, __init__)
2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV PYTHONUNBUFFERED 1 (cluster.py:424, __init__)
2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV DOCKER_PYTHON_BOTTLE_TAG d862517635bf (cluster.py:424, __init__)
2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV UBSAN_OPTIONS print_stacktrace=1 (cluster.py:424, __init__)
2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV PYTEST_ADDOPTS --dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active test_backup_restore_on_cluster/test_different_versions.py::test_different_versions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv (cluster.py:424, __init__)
2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV COMPOSE_HTTP_TIMEOUT 600 (cluster.py:424, __init__)
2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV DOCKER_MYSQL_PHP_CLIENT_TAG 88be89c1e3b6 (cluster.py:424, __init__)
2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV DOCKER_DOTNET_CLIENT_TAG 11de0b29a15d (cluster.py:424, __init__)
2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV CLICKHOUSE_TESTS_CLIENT_BIN_PATH /clickhouse (cluster.py:424, __init__) 2025-11-13 08:32:05.515000 [ 639 ] DEBUG : ENV DOCKER_MYSQL_JS_CLIENT_TAG 41ba7c2ec2a1 (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV PATH /spark-3.3.2-bin-hadoop3/bin:/opt/gdb/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV DOCKER_KERBERIZED_HADOOP_TAG latest (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV DOCKER_CHANNEL stable (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV DOCKER_CLIENT_TIMEOUT 300 (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV DOCKER_POSTGRESQL_JAVA_CLIENT_TAG a4eff5c7f4d6 (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV DOCKER_NGINX_DAV_TAG b55ac9cd7519 (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV DOCKER_MYSQL_GOLANG_CLIENT_TAG 9bec2a638e6e (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV PWD /ClickHouse/tests/integration (cluster.py:424, __init__) 2025-11-13 08:32:05.516000 [ 639 ] DEBUG : ENV DOCKER_MYSQL_JAVA_CLIENT_TAG 766bff31cfe4 (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV CLICKHOUSE_TESTS_BASE_CONFIG_DIR /clickhouse-config (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV TZ Etc/UTC (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV JAVA_PATH /usr/lib/jvm/java-11-openjdk-amd64/bin/java (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV DOCKER_BASE_TAG 5ccda723c1fc (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV SPARK_HOME /spark-3.3.2-bin-hadoop3 (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV LC_CTYPE C.UTF-8 (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV INTEGRATION_TESTS_RUN_ID 1 (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV WORKER_FREE_PORTS 30100 30101 30102 30103 30104 30105 30106 30107 30108 30109 30110 30111 30112 30113 30114 30115 30116 30117 30118 30119 30120 30121 30122 30123 30124 30125 30126 30127 30128 30129 30130 30131 30132 30133 30134 30135 30136 30137 30138 30139 30140 30141 30142 30143 30144 30145 30146 30147 30148 30149 (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV PYTEST_XDIST_TESTRUNUID 3ac7ff91cea7432891cb63373a9408d7 (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV PYTEST_XDIST_WORKER gw2 (cluster.py:424, __init__) 2025-11-13 08:32:05.517000 [ 639 ] DEBUG : ENV PYTEST_XDIST_WORKER_COUNT 10 (cluster.py:424, __init__) 2025-11-13 08:32:05.518000 [ 639 ] DEBUG : ENV PYTEST_CURRENT_TEST test_database_delta/test.py::test_complex_table_schema (setup) (cluster.py:424, __init__) 2025-11-13 08:32:05.518000 [ 639 ] DEBUG : CLUSTER INIT base_config_dir:/clickhouse-config (cluster.py:724, __init__) 2025-11-13 08:32:05.519000 [ 639 ] DEBUG : clickhouse_start_command: clickhouse server --config-file=/etc/clickhouse-server/{main_config_file} --log-file=/var/log/clickhouse-server/clickhouse-server.log --errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log (cluster.py:1662, add_instance) 2025-11-13 08:32:05.519000 [ 639 ] DEBUG : Cluster name: project_name:roottestdatabasedelta-gw2. 
Added instance name:node1 tag:latest base_cmd:['docker', 'compose', '--env-file', '/ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/.env', '--project-name', 'roottestdatabasedelta-gw2', '--file', '/ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/docker-compose.yml'] docker_compose_yml_dir:/ClickHouse/tests/integration/helpers/../../../tests/integration/compose/ (cluster.py:1948, add_instance) 2025-11-13 08:32:05.519000 [ 639 ] INFO : Starting cluster... (test.py:42, started_cluster) 2025-11-13 08:32:05.519000 [ 639 ] INFO : Running tests in /ClickHouse/tests/integration/test_database_delta/test.py (cluster.py:2738, start) 2025-11-13 08:32:05.519000 [ 639 ] DEBUG : Cluster start called. is_up=False (cluster.py:2745, start) 2025-11-13 08:32:05.543000 [ 639 ] DEBUG : Docker networks for project roottestdatabasedelta-gw2 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:05.569000 [ 639 ] DEBUG : Docker containers for project roottestdatabasedelta-gw2 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:05.593000 [ 639 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw2 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:32:05.593000 [ 639 ] DEBUG : Cleanup called (cluster.py:851, cleanup) 2025-11-13 08:32:05.618000 [ 639 ] DEBUG : Docker networks for project roottestdatabasedelta-gw2 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces) 2025-11-13 08:32:05.642000 [ 639 ] DEBUG : Docker containers for project roottestdatabasedelta-gw2 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces) 2025-11-13 08:32:05.666000 [ 639 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw2 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces) 2025-11-13 08:32:05.666000 [ 639 ] DEBUG : Command:[docker container list --all --filter name='^/roottestdatabasedelta-gw2-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.691000 [ 639 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup) 2025-11-13 08:32:05.691000 [ 639 ] DEBUG : No running containers for project: roottestdatabasedelta-gw2 (cluster.py:879, cleanup) 2025-11-13 08:32:05.691000 [ 639 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup) 2025-11-13 08:32:05.716000 [ 639 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup) 2025-11-13 08:32:05.716000 [ 639 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.739000 [ 639 ] DEBUG : Stderr:Error response from daemon: a prune operation is already running (cluster.py:147, run_and_check) 2025-11-13 08:32:05.739000 [ 639 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:32:05.739000 [ 639 ] DEBUG : Trying to prune unused volumes... 
(cluster.py:910, cleanup) 2025-11-13 08:32:05.739000 [ 639 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check) 2025-11-13 08:32:05.763000 [ 639 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check) 2025-11-13 08:32:05.764000 [ 639 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup) 2025-11-13 08:32:05.764000 [ 639 ] DEBUG : Setup directory for instance: node1 (cluster.py:2758, start) 2025-11-13 08:32:05.765000 [ 639 ] DEBUG : Create directory for configuration generated in this helper (cluster.py:4628, create_dir) 2025-11-13 08:32:05.765000 [ 639 ] DEBUG : Create directory for common tests configuration (cluster.py:4633, create_dir) 2025-11-13 08:32:05.765000 [ 639 ] DEBUG : Copy common configuration from helpers (cluster.py:4653, create_dir) 2025-11-13 08:32:05.766000 [ 639 ] DEBUG : Generate and write macros file (cluster.py:4705, create_dir) 2025-11-13 08:32:05.766000 [ 639 ] DEBUG : Copy custom test config files [] to /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/configs/config.d (cluster.py:4741, create_dir) 2025-11-13 08:32:05.766000 [ 639 ] DEBUG : Setup database dir /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/database (cluster.py:4758, create_dir) 2025-11-13 08:32:05.766000 [ 639 ] DEBUG : Setup logs dir /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/logs (cluster.py:4769, create_dir) 2025-11-13 08:32:05.766000 [ 639 ] DEBUG : Entrypoint cmd: ["clickhouse", "server", "--config-file=/etc/clickhouse-server/config.xml", "--log-file=/var/log/clickhouse-server/clickhouse-server.log", "--errorlog-file=/var/log/clickhouse-server/clickhouse-server.err.log", "--"] (cluster.py:4850, create_dir) 2025-11-13 08:32:05.767000 [ 639 ] DEBUG : Env {'ASAN_OPTIONS': 'use_sigaltstack=0', 'TSAN_OPTIONS': 'use_sigaltstack=0', 'CLICKHOUSE_WATCHDOG_ENABLE': '0', 'CLICKHOUSE_NATS_TLS_SECURE': '0', 'LLVM_PROFILE_FILE': '/var/lib/clickhouse/server_%h_%p_%m.profraw'} stored in /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/.env (cluster.py:96, _create_env_file) 2025-11-13 08:32:05.767000 [ 639 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-11-13 08:32:05.768000 [ 639 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-11-13 08:32:05.768000 [ 639 ] DEBUG : Trying paths: ['/root/.docker/config.json', '/root/.dockercfg'] (config.py:21, find_config_file) 2025-11-13 08:32:05.768000 [ 639 ] DEBUG : No config file found (config.py:28, find_config_file) 2025-11-13 08:32:05.780000 [ 639 ] DEBUG : http://localhost:None "GET /version HTTP/1.1" 200 826 (connectionpool.py:547, _make_request) 2025-11-13 08:32:05.781000 [ 639 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/.env --project-name roottestdatabasedelta-gw2 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/docker-compose.yml pull] (cluster.py:121, run_and_check) 2025-11-13 08:32:16.224000 [ 639 ] DEBUG : Stderr: node1 Pulling (cluster.py:147, run_and_check) 2025-11-13 08:32:16.225000 [ 639 ] DEBUG : Stderr: node1 Pulled (cluster.py:147, run_and_check) 2025-11-13 08:32:16.225000 [ 639 ] DEBUG : ('Trying to create ClickHouse instance by command %s', 'docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/.env --project-name roottestdatabasedelta-gw2 --file 
/ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/docker-compose.yml up -d --no-recreate') (cluster.py:3139, start) 2025-11-13 08:32:16.225000 [ 639 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/.env --project-name roottestdatabasedelta-gw2 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/docker-compose.yml up -d --no-recreate] (cluster.py:121, run_and_check) 2025-11-13 08:32:16.804000 [ 639 ] DEBUG : Stderr: Network roottestdatabasedelta-gw2_default Creating (cluster.py:147, run_and_check) 2025-11-13 08:32:16.805000 [ 639 ] DEBUG : Stderr: Network roottestdatabasedelta-gw2_default Created (cluster.py:147, run_and_check) 2025-11-13 08:32:16.805000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Creating (cluster.py:147, run_and_check) 2025-11-13 08:32:16.805000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Created (cluster.py:147, run_and_check) 2025-11-13 08:32:16.805000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Starting (cluster.py:147, run_and_check) 2025-11-13 08:32:16.805000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Started (cluster.py:147, run_and_check) 2025-11-13 08:32:16.805000 [ 639 ] DEBUG : ClickHouse instance created (cluster.py:3147, start) 2025-11-13 08:32:16.805000 [ 639 ] DEBUG : get_instance_ip instance_name=node1 (cluster.py:2005, get_instance_ip) 2025-11-13 08:32:16.808000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw2-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:16.809000 [ 639 ] DEBUG : get_instance_ip instance_name=node1 (cluster.py:2015, get_instance_global_ipv6) 2025-11-13 08:32:16.811000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw2-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:16.812000 [ 639 ] DEBUG : Waiting for ClickHouse start in node1, ip: 172.16.1.2... 
(cluster.py:3155, start) 2025-11-13 08:32:16.817000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/roottestdatabasedelta-gw2-node1-1/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:16.820000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:16.924000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.027000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.131000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.235000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.339000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.442000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.546000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.650000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.755000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.859000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:17.962000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.066000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.169000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.273000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.376000 [ 639 ] DEBUG : http://localhost:None "GET 
/v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.480000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.584000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.689000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.792000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:18.897000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:19.001000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:19.106000 [ 639 ] DEBUG : http://localhost:None "GET /v1.46/containers/1026393d777cfb61c4783ea6a64a60002d42176961643a4af25fd95ce70ae5f5/json HTTP/1.1" 200 None (connectionpool.py:547, _make_request) 2025-11-13 08:32:19.108000 [ 639 ] DEBUG : ClickHouse node1 started (cluster.py:3159, start) 2025-11-13 08:32:19.109000 [ 639 ] DEBUG : run container_id:roottestdatabasedelta-gw2-node1-1 detach:False nothrow:False cmd: ['bash', '-c', 'cd /unitycatalog && nohup bin/start-uc-server &'] (cluster.py:2051, exec_in_container) 2025-11-13 08:32:19.109000 [ 639 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw2-node1-1 bash -c cd /unitycatalog && nohup bin/start-uc-server &] (cluster.py:121, run_and_check) ------------------------------ Captured log call ------------------------------- 2025-11-13 08:32:21.180000 [ 639 ] DEBUG : run container_id:roottestdatabasedelta-gw2-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE SCHEMA schema_with_complex_tables" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:32:21.181000 [ 639 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw2-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf 
"spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE SCHEMA schema_with_complex_tables" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:32:27.189000 [ 639 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:32:27.189000 [ 639 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:32:27.189000 [ 639 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:27.189000 [ 639 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:27.189000 [ 639 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:27.189000 [ 639 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-1d1605e0-d426-40f5-b169-ab4018ee53db;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:27.189000 [ 639 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr::: resolution report :: resolve 4210ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:32:27.190000 [ 639 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.191000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.192000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 
639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.193000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.194000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 
639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.195000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.196000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.197000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.198000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.199000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : run container_id:roottestdatabasedelta-gw2-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE TABLE schema_with_complex_tables.complex_table (event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT) using Delta location \'/tmp/complex_schema/complex_table\'" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:32:27.200000 [ 639 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw2-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE TABLE schema_with_complex_tables.complex_table (event_date DATE, event_time TIMESTAMP, hits ARRAY, ids MAP, really_complex STRUCT) using Delta location '/tmp/complex_schema/complex_table'" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:32:33.110000 [ 639 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:32:33.110000 [ 639 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars 
(cluster.py:147, run_and_check) 2025-11-13 08:32:33.110000 [ 639 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:33.110000 [ 639 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:33.110000 [ 639 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:33.110000 [ 639 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-3fc4076b-3bc9-4dab-8ce2-7f8b1bb4c9a8;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:33.110000 [ 639 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:32:33.111000 [ 639 ] DEBUG : Stderr::: resolution report :: resolve 4217ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:32:33.112000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.113000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.114000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.115000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 
639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.116000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.117000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 
639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.118000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.119000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:33.120000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.121000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.122000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.123000 [ 639 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.124000 [ 639 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : run container_id:roottestdatabasedelta-gw2-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "insert into schema_with_complex_tables.complex_table SELECT to_date(\'2024-10-01\', \'yyyy-MM-dd\'), to_timestamp(\'2024-10-01 00:12:00\'), array(42, 123, 77), map(7, \'v7\', 5, \'v5\'), named_struct(\\"f1\\", 34, \\"f2\\", \'hello\')" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:32:33.125000 [ 639 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw2-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "insert into schema_with_complex_tables.complex_table SELECT to_date('2024-10-01', 'yyyy-MM-dd'), to_timestamp('2024-10-01 00:12:00'), array(42, 123, 77), map(7, 'v7', 5, 'v5'), named_struct(\"f1\", 34, \"f2\", 'hello')" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:32:38.886000 [ 639 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:32:38.886000 [ 639 ] DEBUG : Stderr:The 
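Note: the spark-sql invocation logged above resolves its dependencies at submit time through --packages, so it needs either working access to repo1.maven.org / repos.spark-packages.org or an already populated Ivy cache; in this run it has neither, which is why every submission ends in the RuntimeException about unresolved dependencies. A network-independent variant (a sketch only, not what execute_spark_query currently does, and trimmed to the options relevant here) would ship the three jars in the image and pass them with --jars; the /spark-extra-jars path and the helper name are hypothetical.

    # Hypothetical variant of the command built for spark-sql: uses --jars with
    # pre-downloaded artifacts instead of --packages, so no Ivy resolution is needed.
    OFFLINE_JARS = ",".join([
        "/spark-extra-jars/hadoop-aws-3.3.4.jar",            # assumed to be baked into the image
        "/spark-extra-jars/delta-spark_2.12-3.2.1.jar",
        "/spark-extra-jars/unitycatalog-spark_2.12-0.2.0.jar",
    ])

    def build_offline_spark_sql(query: str) -> str:
        return (
            'cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" '
            '--master "local[*]" '
            f'--jars "{OFFLINE_JARS}" '
            '--conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" '
            '--conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" '
            '--conf "spark.sql.catalog.unity.uri=http://localhost:8080" '
            '--conf "spark.sql.defaultCatalog=unity" '
            f'-S -e "{query}"'
        )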
jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:32:38.886000 [ 639 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-42b634e2-1c38-4cbe-9e66-0337097dae59;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.887000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr::: resolution report :: resolve 4286ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.888000 [ 639 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.889000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.890000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 
639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.891000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.892000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.893000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 
639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.894000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.895000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:38.896000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:38.897000 [ 639 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.898000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:32:38.899000 [ 639 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:32:38.900000 [ 639 
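Note: the identical unresolved-dependency dump repeats for every spark-sql call in this job, so each Spark-side comparison in test_database_delta fails for the same environmental reason rather than a product bug. One possible guard, purely a suggestion and not part of the existing helpers, is to probe package resolution once and skip the Spark comparisons when it cannot succeed; whether exec_in_container forwards nothrow this way is an assumption.

    import pytest

    # Hypothetical guard: run a trivial spark-sql statement once and skip the
    # Spark-vs-ClickHouse comparisons if --packages resolution fails, instead of
    # failing every test with the same Ivy error.
    def skip_if_no_spark_packages(node):
        probe = node.exec_in_container(
            ["bash", "-c",
             'cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --packages '
             '"org.apache.hadoop:hadoop-aws:3.3.4" -S -e "SELECT 1" >/dev/null 2>&1; echo $?'],
            nothrow=True,
        )
        if probe.strip() != "0":
            pytest.skip("spark-sql cannot resolve --packages (no network, empty Ivy cache)")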
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check)
2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check)
2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check)
2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check)
2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check)
2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
2025-11-13 08:32:38.900000 [ 639 ] DEBUG : Executing query create database complex_schema engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query)
2025-11-13 08:32:39.166000 [ 639 ] DEBUG : Executing query SHOW TABLES FROM complex_schema LIKE 'schema_with_complex_tables%' on node1 (cluster.py:3648, query)
2025-11-13 08:32:40.235000 [ 639 ] DEBUG : Executing query SHOW CREATE TABLE complex_schema.`schema_with_complex_tables.complex_table` on node1 (cluster.py:3648, query)
______________________ test_embedded_database_and_tables _______________________
[gw2] linux -- Python 3.10.12 /usr/bin/python3
started_cluster =

    def test_embedded_database_and_tables(started_cluster):
        node1 = started_cluster.instances['node1']
        node1.query("create database unity_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
        default_tables = list(sorted(node1.query("SHOW TABLES FROM unity_test LIKE 'default%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
        print("Default tables", default_tables)
        assert default_tables == ['default.marksheet', 'default.marksheet_uniform', 'default.numbers', 'default.user_countries']
        for table in default_tables:
            if table == "default.marksheet_uniform":
                continue
            assert "DeltaLake" in node1.query(f"show create table unity_test.`{table}`")
            if table in ('default.marksheet', 'default.user_countries'):
                data_clickhouse = TSV(node1.query(f"SELECT * FROM unity_test.`{table}` ORDER BY 1,2,3"))
>               data_spark = TSV(execute_spark_query(node1, f"SELECT * FROM unity.{table} ORDER BY 1,2,3"))

test_database_delta/test.py:90:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test_database_delta/test.py:54: in execute_spark_query
    return node.exec_in_container(
helpers/cluster.py:4117: in exec_in_container
    return self.cluster.exec_in_container(
helpers/cluster.py:2069: in exec_in_container
    result = subprocess_check_call(
helpers/cluster.py:239: in subprocess_check_call
    return run_and_check(args, detach=detach, nothrow=nothrow, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
args = ['docker', 
'exec', 'roottestdatabasedelta-gw2-node1-1', 'bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql ...tCatalog=unity" \\\n -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v \'loading settings\'\n'] env = None, shell = False, stdout = -1, stderr = -1, timeout = 300 nothrow = False, detach = False def run_and_check( args: Union[Sequence[str], str], env=None, shell=False, stdout=subprocess.PIPE, stderr=subprocess.PIPE, timeout=300, nothrow=False, detach=False, ) -> str: if shell: if isinstance(args, str): shell_args = args else: shell_args = next(a for a in args) else: shell_args = " ".join(args) logging.debug("Command:[%s]", shell_args) if detach: subprocess.Popen( args, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, env=env, shell=shell, ) return "" res = subprocess.run( args, stdout=stdout, stderr=stderr, env=env, shell=shell, timeout=timeout, check=False, ) out = res.stdout.decode("utf-8", "ignore") err = res.stderr.decode("utf-8", "ignore") # check_call(...) from subprocess does not print stderr, so we do it manually for outline in out.splitlines(): logging.debug("Stdout:%s", outline) for errline in err.splitlines(): logging.debug("Stderr:%s", errline) if res.returncode != 0: logging.debug("Exitcode:%s", res.returncode) if env: logging.debug("Env:%s", env) if not nothrow: > raise Exception( f"Command [{shell_args}] return non-zero code {res.returncode}: {res.stderr.decode('utf-8')}" ) E Exception: Command [docker exec roottestdatabasedelta-gw2-node1-1 bash -c E cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ E --master "local[*]" \ E --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ E --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ E --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ E --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ E --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ E --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ E --conf "spark.sql.catalog.unity.token=" \ E --conf "spark.sql.defaultCatalog=unity" \ E -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v 'loading settings' E ] return non-zero code 1: Ivy Default Cache set to: /root/.ivy2/cache E The jars for the packages stored in: /root/.ivy2/jars E org.apache.hadoop#hadoop-aws added as a dependency E io.delta#delta-spark_2.12 added as a dependency E io.unitycatalog#unitycatalog-spark_2.12 added as a dependency E :: resolving dependencies :: org.apache.spark#spark-submit-parent-20641ad8-3863-46f4-8325-d5309770fadd;1.0 E confs: [default] E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. 
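Note: run_and_check, whose source is reproduced in the traceback above, only raises when nothrow is left at its default False; the earlier INSERT in test_complex_table_schema went through exec_in_container with nothrow:True and merely logged Exitcode:1, while this SELECT goes through subprocess_check_call with nothrow=False and therefore raises. A minimal usage sketch, assuming run_and_check can be imported from helpers/cluster.py as the traceback paths suggest:

    from helpers.cluster import run_and_check

    # nothrow=True: a non-zero exit code is only logged ("Exitcode:..."), mirroring
    # how the earlier INSERT was executed, and the captured stdout is returned.
    out = run_and_check(
        ["docker", "exec", "roottestdatabasedelta-gw2-node1-1", "bash", "-c", "exit 1"],
        nothrow=True,
    )

    # nothrow=False (the default, used via subprocess_check_call here): the same
    # non-zero exit raises Exception with the command and captured stderr attached.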
E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E You probably access the destination server through a proxy server that is not well configured. E :: resolution report :: resolve 4246ms :: artifacts dl 0ms E :: modules in use: E --------------------------------------------------------------------- E | | modules || artifacts | E | conf | number| search|dwnlded|evicted|| number|dwnlded| E --------------------------------------------------------------------- E | default | 3 | 0 | 0 | 0 || 0 | 0 | E --------------------------------------------------------------------- E E :: problems summary :: E :::: WARNINGS E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E module not found: org.apache.hadoop#hadoop-aws;3.3.4 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar E E ==== central: tried E E https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom E E -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: E E https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E module not found: io.delta#delta-spark_2.12;3.2.1 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar E E ==== central: tried E E https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom E E -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: E E https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 E E ==== local-m2-cache: tried E E file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E ==== local-ivy-cache: tried E E /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar E E ==== central: tried E E https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E ==== spark-packages: tried E E https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom E E -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: E E https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar E E :::::::::::::::::::::::::::::::::::::::::::::: E E :: UNRESOLVED DEPENDENCIES :: E E :::::::::::::::::::::::::::::::::::::::::::::: E E :: org.apache.hadoop#hadoop-aws;3.3.4: not found E E :: io.delta#delta-spark_2.12;3.2.1: not found E E :: 
io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found E E :::::::::::::::::::::::::::::::::::::::::::::: E E E E :: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS E Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] E at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) E at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) E at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) E at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) E at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) E at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) E at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) E at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) E at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) E at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) helpers/cluster.py:153: Exception ----------------------------- Captured stdout call ----------------------------- Default tables ['default.marksheet', 'default.marksheet_uniform', 'default.numbers', 'default.user_countries'] ------------------------------ Captured log call ------------------------------- 2025-11-13 08:32:40.811000 [ 639 ] DEBUG : Executing query create database unity_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query) 2025-11-13 08:32:41.077000 [ 639 ] DEBUG : Executing query SHOW TABLES FROM unity_test LIKE 'default%' on node1 (cluster.py:3648, query) 2025-11-13 08:32:41.645000 [ 639 ] DEBUG : Executing query show create table unity_test.`default.marksheet` on node1 (cluster.py:3648, query) 2025-11-13 08:32:41.961000 [ 639 ] DEBUG : Executing query SELECT * FROM unity_test.`default.marksheet` ORDER BY 1,2,3 on node1 (cluster.py:3648, query) 2025-11-13 08:32:42.378000 [ 639 ] DEBUG : run container_id:roottestdatabasedelta-gw2-node1-1 detach:False nothrow:False cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:32:42.378000 [ 639 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw2-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages 
"org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "SELECT * FROM unity.default.marksheet ORDER BY 1,2,3" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:32:48.235000 [ 639 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:32:48.236000 [ 639 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:32:48.236000 [ 639 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:48.236000 [ 639 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:48.236000 [ 639 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:48.236000 [ 639 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-20641ad8-3863-46f4-8325-d5309770fadd;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:48.236000 [ 639 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.237000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr::: resolution report :: resolve 4246ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.238000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.239000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.240000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.241000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 
639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.242000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.243000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 
639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.244000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.245000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.246000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.247000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.248000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.249000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.250000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.250000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:48.250000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.250000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.250000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.250000 [ 639 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.251000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.251000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.251000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.251000 [ 639 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:48.251000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.251000 [ 639 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:48.251000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.251000 [ 639 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:32:48.252000 [ 639 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:32:48.253000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:32:48.253000 [ 639 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-11-13 08:32:48.253000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
2025-11-13 08:32:48.253000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check)
2025-11-13 08:32:48.253000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check)
2025-11-13 08:32:48.253000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check)
2025-11-13 08:32:48.253000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check)
2025-11-13 08:32:48.253000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check)
2025-11-13 08:32:48.253000 [ 639 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
_________________________ test_multiple_schemes_tables _________________________
[gw2] linux -- Python 3.10.12 /usr/bin/python3

started_cluster = 

    def test_multiple_schemes_tables(started_cluster):
        node1 = started_cluster.instances['node1']
        execute_multiple_spark_queries(node1, [f'CREATE SCHEMA test_schema{i}' for i in range(10)], True)
        execute_multiple_spark_queries(node1, [f'CREATE TABLE test_schema{i}.test_table{i} (col1 int, col2 double) using Delta location \'/tmp/test_schema{i}/test_table{i}\'' for i in range(10)], True)
        execute_multiple_spark_queries(node1, [f'INSERT INTO test_schema{i}.test_table{i} VALUES ({i}, {i}.0)' for i in range(10)], True)
        node1.query("create database multi_schema_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false", settings={"allow_experimental_database_unity_catalog": "1"})
        multi_schema_tables = list(sorted(node1.query("SHOW TABLES FROM multi_schema_test LIKE 'test_schema%'", settings={'use_hive_partitioning':'0'}).strip().split('\n')))
        print(multi_schema_tables)
        for i, table in enumerate(multi_schema_tables):
>           assert node1.query(f"SELECT col1 FROM multi_schema_test.`{table}`").strip() == str(i)

test_database_delta/test.py:107: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
helpers/cluster.py:3649: in query
    return self.client.query(
helpers/client.py:39: in wrap
    return func(self, *args, **kwargs)
helpers/client.py:79: in query
    ).get_answer()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 

    def get_answer(self):
        self.process.wait(timeout=DEFAULT_QUERY_TIMEOUT)
        self.stdout_file.seek(0)
        self.stderr_file.seek(0)
        stdout = self.stdout_file.read().decode("utf-8", errors="replace")
        stderr = self.stderr_file.read().decode("utf-8", errors="replace")
        if (
            self.timer is not None
            and not self.process_finished_before_timeout
            and not self.ignore_error
        ):
            logging.debug(f"Timed out. Last stdout:{stdout}, stderr:{stderr}")
            raise QueryTimeoutExceedException("Client timed out!")
        if (
            self.process.returncode != 0 or self.remove_trash_from_stderr(stderr)
        ) and not self.ignore_error:
>           raise QueryRuntimeException(
                "Client failed! Return code: {}, stderr: {}".format(
                    self.process.returncode, stderr
                ),
                self.process.returncode,
                stderr,
            )
E           helpers.client.QueryRuntimeException: Client failed! Return code: 62, stderr: Code: 62. DB::Exception: Syntax error: failed at position 36 (``): ``. Expected one of: Colon, Caret, identifier, end of query. (SYNTAX_ERROR), Stack trace (when copying this message, always include the lines below):
E           
E           0. ./contrib/llvm-project/libcxx/include/__exception/exception.h:113: Poco::Exception::Exception(String const&, int) @ 0x0000000020e10280
E           1. ./build_docker/./src/Common/Exception.cpp:108: DB::Exception::Exception(DB::Exception::MessageMasked&&, int, bool) @ 0x0000000010b395b4
E           2. DB::Exception::createDeprecated(String const&, int, bool) @ 0x000000000881e623
E           3. ./build_docker/./src/Parsers/parseQuery.cpp:411: DB::parseQueryAndMovePosition(DB::IParser&, char const*&, char const*, String const&, bool, unsigned long, unsigned long, unsigned long) @ 0x000000001d065fd0
E           4. ./build_docker/./src/Client/ClientBase.cpp:402: DB::ClientBase::parseQuery(char const*&, char const*, DB::Settings const&, bool) @ 0x000000001ba10cf8
E           5. ./build_docker/./src/Client/ClientBase.cpp:2369: DB::ClientBase::analyzeMultiQueryText(char const*&, char const*&, char const*, String&, std::shared_ptr&, String const&, std::unique_ptr>&) @ 0x000000001ba25ea2
E           6. ./build_docker/./src/Client/ClientBase.cpp:2507: DB::ClientBase::executeMultiQuery(String const&) @ 0x000000001ba26ac6
E           7. ./build_docker/./src/Client/ClientBase.cpp:2776: DB::ClientBase::processQueryText(String const&) @ 0x000000001ba28970
E           8. ./build_docker/./src/Client/ClientBase.cpp:3429: DB::ClientBase::runNonInteractive() @ 0x000000001ba3322b
E           9. ./build_docker/./programs/client/Client.cpp:407: DB::Client::main(std::vector> const&) @ 0x0000000010e0619e
E           10. ./build_docker/./programs/client/Client.cpp:0: non-virtual thunk to DB::Client::main(std::vector> const&) @ 0x0000000010e06941
E           11. ./build_docker/./base/poco/Util/src/Application.cpp:315: Poco::Util::Application::run() @ 0x0000000020f545df
E           12. ./build_docker/./programs/client/Client.cpp:1141: mainEntryClickHouseClient(int, char**) @ 0x0000000010e14b47
E           13. ./build_docker/./programs/main.cpp:295: main @ 0x00000000087e4501
E           14. ? @ 0x00007f316d5f2d90
E           15. ? @ 0x00007f316d5f2e40
E           16.
_start @ 0x000000000873702e helpers/client.py:248: QueryRuntimeException ----------------------------- Captured stdout call ----------------------------- [''] ------------------------------ Captured log call ------------------------------- 2025-11-13 08:32:48.408000 [ 639 ] DEBUG : run container_id:roottestdatabasedelta-gw2-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE SCHEMA test_schema0;CREATE SCHEMA test_schema1;CREATE SCHEMA test_schema2;CREATE SCHEMA test_schema3;CREATE SCHEMA test_schema4;CREATE SCHEMA test_schema5;CREATE SCHEMA test_schema6;CREATE SCHEMA test_schema7;CREATE SCHEMA test_schema8;CREATE SCHEMA test_schema9" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:32:48.408000 [ 639 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw2-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE SCHEMA test_schema0;CREATE SCHEMA test_schema1;CREATE SCHEMA test_schema2;CREATE SCHEMA test_schema3;CREATE SCHEMA test_schema4;CREATE SCHEMA test_schema5;CREATE SCHEMA test_schema6;CREATE SCHEMA test_schema7;CREATE SCHEMA test_schema8;CREATE SCHEMA test_schema9" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:32:54.033000 [ 639 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-483c4a74-8509-44e8-ae0a-b95d293e034a;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:You probably 
access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.034000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr::: resolution report :: resolve 4221ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:32:54.035000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. 
url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.036000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.037000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: (cluster.py:147, 
run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.038000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.039000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.040000 [ 639 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:32:54.040000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.040000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.040000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.040000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.040000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.040000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.040000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.041000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 
2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.042000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.043000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.043000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.043000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.043000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.043000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.043000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.043000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.043000 [ 639 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:54.044000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.044000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.044000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.044000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.044000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.044000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.044000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.044000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.045000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.045000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.045000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.045000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:54.045000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.045000 [ 639 ] DEBUG : Stderr: 
-- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.045000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.045000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.046000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.047000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : 
Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.048000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.049000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:54.049000 [ 639 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:32:54.049000 [ 639 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:32:54.049000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.049000 [ 639 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.049000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.049000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.049000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.050000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.050000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.050000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.050000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.050000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:32:54.050000 [ 639 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:32:54.050000 [ 639 ] DEBUG : run container_id:roottestdatabasedelta-gw2-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf 
"spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "CREATE TABLE test_schema0.test_table0 (col1 int, col2 double) using Delta location \'/tmp/test_schema0/test_table0\';CREATE TABLE test_schema1.test_table1 (col1 int, col2 double) using Delta location \'/tmp/test_schema1/test_table1\';CREATE TABLE test_schema2.test_table2 (col1 int, col2 double) using Delta location \'/tmp/test_schema2/test_table2\';CREATE TABLE test_schema3.test_table3 (col1 int, col2 double) using Delta location \'/tmp/test_schema3/test_table3\';CREATE TABLE test_schema4.test_table4 (col1 int, col2 double) using Delta location \'/tmp/test_schema4/test_table4\';CREATE TABLE test_schema5.test_table5 (col1 int, col2 double) using Delta location \'/tmp/test_schema5/test_table5\';CREATE TABLE test_schema6.test_table6 (col1 int, col2 double) using Delta location \'/tmp/test_schema6/test_table6\';CREATE TABLE test_schema7.test_table7 (col1 int, col2 double) using Delta location \'/tmp/test_schema7/test_table7\';CREATE TABLE test_schema8.test_table8 (col1 int, col2 double) using Delta location \'/tmp/test_schema8/test_table8\';CREATE TABLE test_schema9.test_table9 (col1 int, col2 double) using Delta location \'/tmp/test_schema9/test_table9\'" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:32:54.051000 [ 639 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw2-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "CREATE TABLE test_schema0.test_table0 (col1 int, col2 double) using Delta location '/tmp/test_schema0/test_table0';CREATE TABLE test_schema1.test_table1 (col1 int, col2 double) using Delta location '/tmp/test_schema1/test_table1';CREATE TABLE test_schema2.test_table2 (col1 int, col2 double) using Delta location '/tmp/test_schema2/test_table2';CREATE TABLE test_schema3.test_table3 (col1 int, col2 double) using Delta location '/tmp/test_schema3/test_table3';CREATE TABLE test_schema4.test_table4 (col1 int, col2 double) using Delta location '/tmp/test_schema4/test_table4';CREATE TABLE test_schema5.test_table5 (col1 int, col2 double) using Delta location '/tmp/test_schema5/test_table5';CREATE TABLE test_schema6.test_table6 (col1 int, col2 double) using Delta location '/tmp/test_schema6/test_table6';CREATE TABLE test_schema7.test_table7 (col1 int, col2 double) using Delta location '/tmp/test_schema7/test_table7';CREATE TABLE test_schema8.test_table8 (col1 int, col2 double) using Delta location '/tmp/test_schema8/test_table8';CREATE TABLE test_schema9.test_table9 (col1 int, col2 double) using Delta location '/tmp/test_schema9/test_table9'" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:32:59.770000 [ 639 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars 
(cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-4c897377-9532-4cd1-8ef6-74fcf17f5bb4;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.771000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr::: resolution report :: resolve 4246ms :: artifacts dl 0ms (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.772000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.773000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.774000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.775000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 
639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.776000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.777000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.778000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 
639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.779000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.780000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.781000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.782000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.783000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.784000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.785000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.786000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.786000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check) 2025-11-13 08:32:59.786000 [ 639 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check) 2025-11-13 08:32:59.786000 [ 639 ] DEBUG : run container_id:roottestdatabasedelta-gw2-node1-1 detach:False nothrow:True cmd: ['bash', '-c', '\ncd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \\\n --master "local[*]" \\\n --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \\\n --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \\\n --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \\\n --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \\\n --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \\\n --conf "spark.sql.catalog.unity.token=" \\\n --conf "spark.sql.defaultCatalog=unity" \\\n -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v \'loading settings\'\n'] (cluster.py:2051, exec_in_container) 2025-11-13 08:32:59.786000 [ 639 ] DEBUG : Command:[docker exec roottestdatabasedelta-gw2-node1-1 bash -c cd /spark-3.5.4-bin-hadoop3 && bin/spark-sql --name "s3-uc-test" \ --master "local[*]" \ --packages "org.apache.hadoop:hadoop-aws:3.3.4,io.delta:delta-spark_2.12:3.2.1,io.unitycatalog:unitycatalog-spark_2.12:0.2.0" \ --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \ --conf "spark.sql.catalog.spark_catalog=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem" \ --conf "spark.sql.catalog.unity=io.unitycatalog.spark.UCSingleCatalog" \ --conf "spark.sql.catalog.unity.uri=http://localhost:8080" \ --conf "spark.sql.catalog.unity.token=" \ --conf "spark.sql.defaultCatalog=unity" \ -S -e "INSERT INTO test_schema0.test_table0 VALUES (0, 0.0);INSERT INTO test_schema1.test_table1 VALUES (1, 1.0);INSERT INTO test_schema2.test_table2 VALUES (2, 2.0);INSERT 
INTO test_schema3.test_table3 VALUES (3, 3.0);INSERT INTO test_schema4.test_table4 VALUES (4, 4.0);INSERT INTO test_schema5.test_table5 VALUES (5, 5.0);INSERT INTO test_schema6.test_table6 VALUES (6, 6.0);INSERT INTO test_schema7.test_table7 VALUES (7, 7.0);INSERT INTO test_schema8.test_table8 VALUES (8, 8.0);INSERT INTO test_schema9.test_table9 VALUES (9, 9.0)" | grep -v 'loading settings' ] (cluster.py:121, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr:Ivy Default Cache set to: /root/.ivy2/cache (cluster.py:147, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr:The jars for the packages stored in: /root/.ivy2/jars (cluster.py:147, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr:org.apache.hadoop#hadoop-aws added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr:io.delta#delta-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr:io.unitycatalog#unitycatalog-spark_2.12 added as a dependency (cluster.py:147, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr::: resolving dependencies :: org.apache.spark#spark-submit-parent-2f3e7458-da74-45a1-9291-108cf2832e0e;1.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr: confs: [default] (cluster.py:147, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.536000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr:You probably access the destination server through a proxy server that is not well configured. 
(cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr::: resolution report :: resolve 4293ms :: artifacts dl 1ms (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr: :: modules in use: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.537000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: | | modules || artifacts | (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: | conf | number| search|dwnlded|evicted|| number|dwnlded| (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: | default | 3 | 0 | 0 | 0 || 0 | 0 | (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: --------------------------------------------------------------------- (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr::: problems summary :: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr::::: WARNINGS (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.538000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: module not found: org.apache.hadoop#hadoop-aws;3.3.4 (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.539000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/org.apache.hadoop/hadoop-aws/3.3.4/jars/hadoop-aws.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.540000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 
639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: -- artifact org.apache.hadoop#hadoop-aws;3.3.4!hadoop-aws.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/org/apache/hadoop/hadoop-aws/3.3.4/hadoop-aws-3.3.4.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.541000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: module not found: io.delta#delta-spark_2.12;3.2.1 (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.542000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 
639 ] DEBUG : Stderr: /root/.ivy2/local/io.delta/delta-spark_2.12/3.2.1/jars/delta-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.543000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: -- artifact io.delta#delta-spark_2.12;3.2.1!delta-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.544000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/delta/delta-spark_2.12/3.2.1/delta-spark_2.12-3.2.1.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: Host repos.spark-packages.org not found. 
url=https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: module not found: io.unitycatalog#unitycatalog-spark_2.12;0.2.0 (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: ==== local-m2-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.545000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: file:/root/.m2/repository/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: ==== local-ivy-cache: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/ivys/ivy.xml (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.546000 [ 639 ] DEBUG : Stderr: /root/.ivy2/local/io.unitycatalog/unitycatalog-spark_2.12/0.2.0/jars/unitycatalog-spark_2.12.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: ==== central: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: https://repo1.maven.org/maven2/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: ==== spark-packages: tried (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : 
Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.pom (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.547000 [ 639 ] DEBUG : Stderr: -- artifact io.unitycatalog#unitycatalog-spark_2.12;0.2.0!unitycatalog-spark_2.12.jar: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: https://repos.spark-packages.org/io/unitycatalog/unitycatalog-spark_2.12/0.2.0/unitycatalog-spark_2.12-0.2.0.jar (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: :: UNRESOLVED DEPENDENCIES :: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: :: org.apache.hadoop#hadoop-aws;3.3.4: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.548000 [ 639 ] DEBUG : Stderr: :: io.delta#delta-spark_2.12;3.2.1: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: :: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: :::::::::::::::::::::::::::::::::::::::::::::: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr::: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr:Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-aws;3.3.4: not found, unresolved dependency: io.delta#delta-spark_2.12;3.2.1: not found, unresolved dependency: io.unitycatalog#unitycatalog-spark_2.12;0.2.0: not found] (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1613) (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185) (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339) (cluster.py:147, run_and_check) 2025-11-13 08:33:05.549000 [ 639 
] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969) (cluster.py:147, run_and_check)
2025-11-13 08:33:05.550000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199) (cluster.py:147, run_and_check)
2025-11-13 08:33:05.550000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222) (cluster.py:147, run_and_check)
2025-11-13 08:33:05.550000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91) (cluster.py:147, run_and_check)
2025-11-13 08:33:05.550000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125) (cluster.py:147, run_and_check)
2025-11-13 08:33:05.550000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134) (cluster.py:147, run_and_check)
2025-11-13 08:33:05.550000 [ 639 ] DEBUG : Stderr: at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) (cluster.py:147, run_and_check)
2025-11-13 08:33:05.550000 [ 639 ] DEBUG : Exitcode:1 (cluster.py:149, run_and_check)
2025-11-13 08:33:05.550000 [ 639 ] DEBUG : Executing query create database multi_schema_test engine DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') settings warehouse = 'unity', catalog_type='unity', vended_credentials=false on node1 (cluster.py:3648, query)
2025-11-13 08:33:05.816000 [ 639 ] DEBUG : Executing query SHOW TABLES FROM multi_schema_test LIKE 'test_schema%' on node1 (cluster.py:3648, query)
2025-11-13 08:33:06.333000 [ 639 ] DEBUG : Executing query SELECT col1 FROM multi_schema_test.`` on node1 (cluster.py:3648, query)
---------------------------- Captured log teardown -----------------------------
2025-11-13 08:33:07.198000 [ 639 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/.env --project-name roottestdatabasedelta-gw2 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/docker-compose.yml stop --timeout 20] (cluster.py:121, run_and_check)
2025-11-13 08:33:12.476000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Stopping (cluster.py:147, run_and_check)
2025-11-13 08:33:12.476000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Stopped (cluster.py:147, run_and_check)
2025-11-13 08:33:12.476000 [ 639 ] DEBUG : Command:[bash -c [ -f /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/logs/stderr.log ] && zgrep -aH "==================" /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/logs/stderr.log* | ( [ -z "" ] && cat || grep -v "$" ) || true] (cluster.py:121, run_and_check)
2025-11-13 08:33:12.489000 [ 639 ] DEBUG : Command:[docker compose --env-file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/.env --project-name roottestdatabasedelta-gw2 --file /ClickHouse/tests/integration/test_database_delta/_instances-1-gw2/node1/docker-compose.yml down --volumes] (cluster.py:121, run_and_check)
2025-11-13 08:33:12.949000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Stopping (cluster.py:147, run_and_check)
2025-11-13 08:33:12.949000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Stopped (cluster.py:147, run_and_check)
2025-11-13 08:33:12.949000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Removing (cluster.py:147, run_and_check)
2025-11-13 08:33:12.949000 [ 639 ] DEBUG : Stderr: Container roottestdatabasedelta-gw2-node1-1 Removed (cluster.py:147, run_and_check)
2025-11-13 08:33:12.949000 [ 639 ] DEBUG : Stderr: Network roottestdatabasedelta-gw2_default Removing (cluster.py:147, run_and_check)
2025-11-13 08:33:12.950000 [ 639 ] DEBUG : Stderr: Network roottestdatabasedelta-gw2_default Removed (cluster.py:147, run_and_check)
2025-11-13 08:33:12.950000 [ 639 ] DEBUG : Cleanup called (cluster.py:851, cleanup)
2025-11-13 08:33:12.972000 [ 639 ] DEBUG : Docker networks for project roottestdatabasedelta-gw2 are NETWORK ID NAME DRIVER SCOPE (cluster.py:830, print_all_docker_pieces)
2025-11-13 08:33:12.993000 [ 639 ] DEBUG : Docker containers for project roottestdatabasedelta-gw2 are CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES (cluster.py:838, print_all_docker_pieces)
2025-11-13 08:33:13.017000 [ 639 ] DEBUG : Docker volumes for project roottestdatabasedelta-gw2 are DRIVER VOLUME NAME (cluster.py:846, print_all_docker_pieces)
2025-11-13 08:33:13.018000 [ 639 ] DEBUG : Command:[docker container list --all --filter name='^/roottestdatabasedelta-gw2-.*-1$' --format '{{.ID}}:{{.Names}}'] (cluster.py:121, run_and_check)
2025-11-13 08:33:13.040000 [ 639 ] DEBUG : Unstopped containers: {} (cluster.py:865, cleanup)
2025-11-13 08:33:13.040000 [ 639 ] DEBUG : No running containers for project: roottestdatabasedelta-gw2 (cluster.py:879, cleanup)
2025-11-13 08:33:13.040000 [ 639 ] DEBUG : Trying to prune unused networks... (cluster.py:885, cleanup)
2025-11-13 08:33:13.058000 [ 639 ] DEBUG : Trying to prune unused images... (cluster.py:901, cleanup)
2025-11-13 08:33:13.058000 [ 639 ] DEBUG : Command:[docker image prune -f] (cluster.py:121, run_and_check)
2025-11-13 08:33:13.091000 [ 639 ] DEBUG : Stdout:Total reclaimed space: 0B (cluster.py:145, run_and_check)
2025-11-13 08:33:13.091000 [ 639 ] DEBUG : Images pruned (cluster.py:904, cleanup)
2025-11-13 08:33:13.091000 [ 639 ] DEBUG : Trying to prune unused volumes... (cluster.py:910, cleanup)
2025-11-13 08:33:13.091000 [ 639 ] DEBUG : Command:[docker volume ls | wc -l] (cluster.py:121, run_and_check)
2025-11-13 08:33:13.117000 [ 639 ] DEBUG : Stdout:1 (cluster.py:145, run_and_check)
2025-11-13 08:33:13.117000 [ 639 ] DEBUG : Volumes pruned: 1 (cluster.py:915, cleanup)
----------------- generated report log file: parallel0_1.jsonl -----------------
============================== slowest durations ===============================
19.42s call test_database_delta/test.py::test_complex_table_schema
18.70s call test_database_delta/test.py::test_multiple_schemes_tables
18.15s setup test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
15.72s setup test_database_delta/test.py::test_complex_table_schema
13.73s setup test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
8.63s teardown test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
7.44s call test_database_delta/test.py::test_embedded_database_and_tables
5.92s teardown test_database_delta/test.py::test_multiple_schemes_tables
4.12s call test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
2.92s teardown test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
0.27s call test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active
0.00s setup test_database_delta/test.py::test_embedded_database_and_tables
0.00s teardown test_database_delta/test.py::test_complex_table_schema
0.00s teardown test_database_delta/test.py::test_embedded_database_and_tables
0.00s setup test_database_delta/test.py::test_multiple_schemes_tables
=========================== short test summary info ============================
FAILED test_backup_restore_on_cluster/test_different_versions.py::test_different_versions
FAILED test_database_delta/test.py::test_complex_table_schema - helpers.clien...
FAILED test_database_delta/test.py::test_embedded_database_and_tables - Excep...
FAILED test_database_delta/test.py::test_multiple_schemes_tables - helpers.cl...
SKIPPED [1] test_asynchronous_metric_jemalloc_profile_active/test.py:30: Disabled for sanitizers
=================== 4 failed, 1 skipped in 69.47s (0:01:09) ====================
Traceback (most recent call last):
  File "/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration/./runner", line 492, in <module>
    subprocess.check_call(cmd, shell=True, bufsize=0)
  File "/usr/lib/python3.10/subprocess.py", line 369, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'docker run --rm --name clickhouse_integration_tests_4t2s3x --privileged --dns-search='.' --memory=30709026816 --security-opt seccomp=unconfined --cap-add=SYS_PTRACE --volume=/home/ubuntu/_work/_temp/test/build/clickhouse:/clickhouse --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/programs/server:/clickhouse-config --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/tests/integration:/ClickHouse/tests/integration --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/backupview:/ClickHouse/utils/backupview --volume=/home/ubuntu/_work/ClickHouse/ClickHouse/utils/grpc-client/pb2:/ClickHouse/utils/grpc-client/pb2 --volume=/run:/run/host:ro --volume=clickhouse_integration_tests_volume:/var/lib/docker -e DOCKER_DOTNET_CLIENT_TAG=11de0b29a15d -e DOCKER_HELPER_TAG=5dc43a6382f0 -e DOCKER_BASE_TAG=5ccda723c1fc -e DOCKER_KERBEROS_KDC_TAG=9391ecdee8d7 -e DOCKER_MYSQL_GOLANG_CLIENT_TAG=9bec2a638e6e -e DOCKER_MYSQL_JAVA_CLIENT_TAG=766bff31cfe4 -e DOCKER_MYSQL_JS_CLIENT_TAG=41ba7c2ec2a1 -e DOCKER_MYSQL_PHP_CLIENT_TAG=88be89c1e3b6 -e DOCKER_NGINX_DAV_TAG=b55ac9cd7519 -e DOCKER_POSTGRESQL_JAVA_CLIENT_TAG=a4eff5c7f4d6 -e DOCKER_PYTHON_BOTTLE_TAG=d862517635bf -e DOCKER_CLIENT_TIMEOUT=300 -e COMPOSE_HTTP_TIMEOUT=600 -e PYTHONUNBUFFERED=1 -e PYTEST_ADDOPTS="--dist=loadfile -n 10 -rfEps --run-id=1 --color=no --durations=0 --report-log=parallel0_1.jsonl --report-log-exclude-logs-on-passed-tests test_asynchronous_metric_jemalloc_profile_active/test.py::test_asynchronous_metric_jemalloc_profile_active test_backup_restore_on_cluster/test_different_versions.py::test_different_versions test_database_delta/test.py::test_complex_table_schema test_database_delta/test.py::test_embedded_database_and_tables test_database_delta/test.py::test_multiple_schemes_tables -vvv " altinityinfra/integration-tests-runner:226bfaf75ac1 ' returned non-zero exit status 1.
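
Note on the final traceback: the only thing it guarantees is that runner line 492 calls subprocess.check_call(cmd, shell=True, bufsize=0) on the docker run command and that the container exited with status 1. A minimal sketch of that failure path, assuming nothing else about the runner; the command string below is a placeholder, not the real command it builds.

import subprocess

# Placeholder stand-in for the docker run command the runner assembles;
# only the check_call invocation mirrors the traceback above.
cmd = "docker run --rm altinityinfra/integration-tests-runner:226bfaf75ac1 true"

try:
    # Same call shape as tests/integration/runner line 492: a non-zero exit
    # status from the container (pytest reported 4 failed) surfaces here.
    subprocess.check_call(cmd, shell=True, bufsize=0)
except subprocess.CalledProcessError as exc:
    # exc.returncode carries the docker run exit status (1 in this log).
    print(f"integration-tests-runner container exited with status {exc.returncode}")
    raise
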
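Separately, the "Executing query" lines captured earlier for the test_database_delta failure can be replayed by hand. A minimal sketch, assuming a local clickhouse-client in PATH and the Unity Catalog endpoint and settings copied verbatim from the log; the run_sql helper and the first-table selection are illustrative only, not the actual test code.

import subprocess

def run_sql(sql: str) -> str:
    # Illustrative helper; the real test drives these statements through the
    # integration-test cluster helpers (cluster.py query()) on node1.
    return subprocess.check_output(["clickhouse-client", "--query", sql], text=True)

# Database backed by the Unity Catalog endpoint, settings as in the log.
run_sql(
    "CREATE DATABASE multi_schema_test "
    "ENGINE = DataLakeCatalog('http://localhost:8080/api/2.1/unity-catalog') "
    "SETTINGS warehouse = 'unity', catalog_type = 'unity', vended_credentials = false"
)

# List the expected schema tables, then read a column from the first one.
# If the listing comes back empty, this sketch emits the same empty-identifier
# query (SELECT col1 FROM multi_schema_test.``) that appears in the log.
tables = run_sql("SHOW TABLES FROM multi_schema_test LIKE 'test_schema%'").splitlines()
name = tables[0] if tables else ""
print(run_sql(f"SELECT col1 FROM multi_schema_test.`{name}`"))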